3D Tabletop User Interface Using Virtual Elastic Objects

Figure 1: 3D interaction with a virtual elastic object

Hiroaki Tateyama
Graduate School of Science and Engineering, Saitama University
255 Shimo-Okubo, Sakura-ku, Saitama City, Japan
hiroaki@is.ics.saitama-u.ac.jp

Takumi Kusano
Graduate School of Science and Engineering, Saitama University
255 Shimo-Okubo, Sakura-ku, Saitama City, Japan
kusano@is.ics.saitama-u.ac.jp

Takashi Komuro
Graduate School of Science and Engineering, Saitama University
255 Shimo-Okubo, Sakura-ku, Saitama City, Japan
komuro@mail.saitama-u.ac.jp

Abstract

In this paper, we propose a method to reduce the inconsistency between virtual and real spaces when a user manipulates a 3D virtual object with the fingers. When a user tries to hold a virtual object, the fingers do not stop on the surface of the object but thrust into it, since virtual objects cannot exert reaction force. We therefore prevent the fingers from thrusting into a virtual object by letting the object deform or glide through the fingers. A virtual object is deformed by using a spring-based model and solving the equations of equilibrium. Whether the object glides through the fingers or not is determined by calculating the resultant force added to the object and the resultant force of static friction when the fingers touch the object. Based on these methods, we constructed a 3D tabletop interface that enables interaction with virtual objects with a greater sense of reality.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author. Copyright is held by the owner/author(s).
ITS '14, Nov 16-19, 2014, Dresden, Germany
ACM 978-1-4503-2587-5/14/11.
http://dx.doi.org/10.1145/2669485.2669533

Author Keywords
Virtual reality; 3D interaction; object manipulation; vision-based UI.

ACM Classification Keywords
H.5.2 [Information interfaces and presentation]: User Interfaces - Interaction styles.

Introduction

With the progress of 3D display technology, research on interaction systems that allow users to manipulate stereoscopic 3D virtual objects has been actively conducted. For example, an interaction system has been developed that allows users to move a 3D virtual object on an autostereoscopic display by putting their hand close to the table [1], but its users cannot manipulate the object directly. Systems that enable direct interaction with a virtual object have also been developed. Yoshida et al. developed a system that presents 42 different viewpoint images without glasses and enables interaction with a sense of reality [2]. Interaction with a greater sense of reality in a virtual space has been realized by projecting proper 3D images according to the user's real-time viewpoint position and by moving virtual objects according to the user's motion following physical laws [3]. A high-speed stereo camera can also be used to enhance the sense of reality: a system has been developed that detects the position of the user's fingers with a high-speed stereo camera and moves a virtual object linked to that position with little latency [4].

These systems enable interaction with virtual objects using the user's hands, but the interaction is limited to simple manipulations such as pushing. On the other hand, there are systems that realize various manipulations such as holding and lifting [5, 6], which enable flexible and realistic interaction. However, there is a problem in that virtual objects cannot exert reaction force. When a user tries to hold a virtual object, for example, the fingers do not stop on the surface of the object but thrust into it. This causes inconsistency between the real space and the virtual space, and impairs the sense of reality.

A solution to this problem has been proposed for VR systems using a head-mounted display (HMD) [7, 8]. Users control a virtual hand seen through the HMD with the movement of their real hand, and the system stops the virtual fingers on the surface of an object before they thrust into it. However, this method can be used only when the user's real hand is invisible and only the virtual hand is visible.

In this paper, we prevent the fingers from thrusting into a virtual object by elasticizing the object and letting it deform or glide through the fingers, depending on the force added to it. Thereby, we reduce the inconsistency between virtual and real spaces in manipulating a 3D virtual object with the fingers. We also constructed a 3D tabletop interface based on the proposed method that enables interaction with virtual objects with a greater sense of reality.

Virtual Object Deformation

Virtual object deformation is a method to elasticize a virtual object and to let it deform along the shape of the user's fingers. In this study, we use cylindrical objects and apply a spring-based model. We consider only the top-view shape of the objects and ignore their height. Figure 2(a) shows the spring-based model that is used to represent a virtual object. A certain number of control points are placed uniformly on the circumference, and each control point is connected to its neighboring points and to the central point with a spring. Since the control points that compose a virtual object are closely arranged when seen locally, they can be regarded as linearly connected points. Figure 2(b) shows the approximated control points that are linearly connected. There are two cases in which control points move:

1) Control points are moved by a finger.
2) Control points are moved by forces from the neighboring springs.

Figure 2: Spring-based model. (a) Cylindrical object; (b) Linear approximation

Let $x_i$ be the horizontal position of control point $i$ ($i = 1, 2, \ldots, N$). When a control point is moved by a finger to the position $\bar{x}_i$, $x_i$ is determined by the following equation:

$x_i = \bar{x}_i$  (1)

Figure 3 shows the equilibrium of forces when control points are moved by forces from the neighboring springs. Let $l$ be the natural length of the vertical springs, $l_i$ the length of a vertical spring after deformation, $\theta_i$ the angle of a vertical spring, and $k_v$ and $k_h$ the spring constants of the vertical and horizontal springs. The equation of the equilibrium of forces is written as follows:

$k_h (l_{i-1} \sin\theta_{i-1} - l_{i+1} \sin\theta_{i+1}) + k_v (l - l_i) = 0$  (2)

Figure 3: Equilibrium of forces

Using the following relations,

$l_{i-1} \sin\theta_{i-1} = x_{i-1} - x_i$  (3)

$l_{i+1} \sin\theta_{i+1} = x_i - x_{i+1}$  (4)

together with $l_i = l + x_i$ for the extension of the vertical spring, the equation can be transformed as follows:

$k_h x_{i-1} - (2k_h + k_v) x_i + k_h x_{i+1} = 0$  (5)

The positions of all the control points can be calculated by solving the system of equations constructed from Eq. (1) and Eq. (5).
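Concretely, Eq. (1) and Eq. (5) form one sparse linear system: a pinned-value equation for each finger-constrained control point and an equilibrium equation for each free one. The following Python/NumPy sketch solves this system under simplifying assumptions (scalar displacements measured from the rest shape, a ring of control points, and illustrative function names and spring constants that are not from the paper):

```python
import numpy as np

def solve_deformation(n, k_h, k_v, fixed):
    # Build the linear system for the displacements x_i of n control points.
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        if i in fixed:
            # Eq. (1): a finger pins this control point to the fingertip position.
            A[i, i] = 1.0
            b[i] = fixed[i]
        else:
            # Eq. (5): k_h*x[i-1] - (2*k_h + k_v)*x[i] + k_h*x[i+1] = 0,
            # with indices wrapping around the ring of control points.
            A[i, (i - 1) % n] = k_h
            A[i, i] = -(2.0 * k_h + k_v)
            A[i, (i + 1) % n] = k_h
    return np.linalg.solve(A, b)

# Example: one finger pushes control point 0 inward by 5 units on a 32-point ring.
x = solve_deformation(32, k_h=1.0, k_v=0.2, fixed={0: 5.0})
print(x[:4])  # the deformation decays with distance from the contact
```

Because every free row is strictly diagonally dominant whenever $k_v > 0$, the system has a unique solution, matching the intuition that the rest shape is the only equilibrium in the absence of fingers.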

It is necessary to deform a virtual object along the shape of the fingers. Figure 4 shows the flow of the object deformation algorithm.

Figure 4: Object deformation algorithm

The details of the algorithm are as follows.

1) When the system detects that some fingers are inside a virtual object, the control point nearest to each fingertip is moved to the position of that fingertip.
2) The other control points are moved based on the equation of the equilibrium of forces.
3) If the destination of a control point is inside a finger, the point is fixed to the nearest point on the finger's contour.
4) The other control points are again moved based on the equation of the equilibrium of forces.
5) Steps 3) and 4) are repeated until all the control points are outside the fingers.
6) The result is presented to the user.

With this algorithm, users can manipulate virtual objects without the fingers thrusting into the object.

Virtual Object Gliding

Virtual object gliding is a method to let a virtual object glide through the fingers when the object has been deformed to some extent. This prevents the object from over-deforming even when the fingers thrust deeply into it. Whether the object glides through the fingers or not is determined by calculating the resultant force added to the object and the resultant force of static friction when the fingers touch the object.

Let $n$ be the number of contact points between the fingers and the object, $f_i$ the force added to contact point $i$, $F$ the resultant force of the $f_i$, $\theta_i$ the angle between $f_i$ and $F$, and $\mu$ the friction coefficient. The condition for the object to glide through the fingers is as follows:

$|F| > \mu \sum_{i=1}^{n} |f_i| \cos\theta_i$  (6)

Using this method, the pushing manipulation and the holding manipulation can be treated in a unified way. Figure 5 shows the forces applied to a virtual object when $n = 2$.

Figure 5: Forces applied to a virtual object
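As an illustration, the following Python sketch evaluates the gliding condition of Eq. (6) for a set of top-view contact forces. The function name, the vector representation of the forces, and the example values are assumptions for illustration only:

```python
import numpy as np

def object_glides(contact_forces, mu):
    # contact_forces: one 2D force vector f_i per finger contact point (top view).
    f = np.asarray(contact_forces, dtype=float)
    F = f.sum(axis=0)                 # resultant force added to the object
    F_mag = np.linalg.norm(F)
    if F_mag == 0.0:
        return False                  # a perfectly balanced grasp only deforms the object
    f_mag = np.linalg.norm(f, axis=1)
    # cos(theta_i), with theta_i the angle between each f_i and the resultant F
    cos_theta = (f @ F) / np.where(f_mag > 0.0, f_mag * F_mag, 1.0)
    # Eq. (6): the object glides when the resultant force exceeds
    # the resultant static friction.
    return F_mag > mu * np.sum(f_mag * cos_theta)

# Example with n = 2: a pinch whose two finger forces do not cancel exactly,
# so the net force may push the object through the fingers.
print(object_glides([(0.0, 1.0), (0.1, -0.8)], mu=0.6))
```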

3D Tabletop Interface

Based on the proposed methods, we constructed a 3D tabletop interface system that enables interaction with virtual objects. We modified the system developed by Kusano et al. [9] to construct our system. The system consists of an upward-facing multi-view autostereoscopic display and a camera installed above the display. The appearance of the system is shown in Figure 6.

Figure 6: 3D tabletop interface

Users can interact with 3D virtual objects presented on the display. Hand regions are extracted from a camera image using color information, and the system recognizes whether fingers are inside a virtual object. We implemented the virtual object deformation and gliding methods in this system. The results of deformation and gliding are illustrated in Figure 7. The processing time was short enough that the system ran in real time at 60 fps.

Figure 7: Object deformation and gliding
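A color-based hand segmentation of the kind the system relies on can be prototyped in a few lines. The Python/OpenCV sketch below is a minimal illustration; the HSV skin-tone bounds, the morphological clean-up, and the camera index are assumptions, not the authors' actual parameters:

```python
import cv2
import numpy as np

def extract_hand_mask(frame_bgr):
    # Segment skin-colored pixels in HSV space (illustrative bounds).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)
    upper = np.array([25, 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Remove speckle noise and fill small holes so contours stay stable.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

cap = cv2.VideoCapture(0)  # camera installed above the display
ok, frame = cap.read()
if ok:
    mask = extract_hand_mask(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    print(f"{len(contours)} candidate hand region(s)")
cap.release()
```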

Conclusions

In this study, we proposed a method to reduce the inconsistency between virtual and real spaces in manipulating 3D virtual objects. By elasticizing a virtual object, the system prevents the user's fingers from thrusting into the virtual object. Based on this approach, we constructed a 3D tabletop interface that enables interaction with virtual objects with a greater sense of reality. In future work, we will extend the shapes of virtual objects to cubes, triangular pyramids, and others, realize a greater variety of operations such as pushing and lifting, and construct practical application systems.

References

1. Kobayashi, K., Oikawa, M., Koike, T., Utsugi, K., Yamasaki, M., and Kitagawa, S. Character Interaction System with Autostereoscopic Display and Range Sensor. In Proc. 3DUI 2007, 95-98.
2. Yoshida, T., Kamuro, S., Minamizawa, K., Nii, H., and Tachi, S. RePro3D: Full-parallax 3D Display with Haptic Feedback using Retro-reflective Projection Technology. In Proc. ISVRI 2011, 49-54.
3. Benko, H., Jota, R., and Wilson, A. MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop. In Proc. CHI 2012, 199-208.
4. Niikura, T., and Komuro, T. 3D Touch Panel Interface Using an Autostereoscopic Display. In Proc. ITS 2012, 295-298.
5. Hilliges, O., Kim, D., Izadi, S., Weiss, M., and Wilson, A. HoloDesk: Direct 3D Interactions with a Situated See-Through Display. In Proc. CHI 2012, 2421-2430.
6. Lee, J., Olwal, A., Ishii, H., and Boulanger, C. SpaceTop: Integrating 2D and Spatial 3D Interactions in a See-through Desktop Environment. In Proc. CHI 2013, 189-192.
7. Prachyabrued, M., and Borst, C.W. Dropping the Ball: Releasing a Virtual Grasp. In Proc. 3DUI 2011, 59-66.
8. Burns, E., Razzaque, S., Panter, A.T., Whitton, M.C., McCallus, M.R., and Brooks Jr., F.P. The Hand is Slower than the Eye: A Quantitative Exploration of Visual Dominance over Proprioception. In Proc. VR 2005, 3-10.
9. Kusano, T., Niikura, T., and Komuro, T. A Virtually Tangible 3D Interaction System using an Autostereoscopic Display. In Proc. SUI 2013, 87.