3D Tabletop User Interface Using Virtual Elastic Objects

Figure 1: 3D interaction with a virtual elastic object

Hiroaki Tateyama, Graduate School of Science and Engineering, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan, hiroaki@is.ics.saitama-u.ac.jp
Takumi Kusano, Graduate School of Science and Engineering, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan, kusano@is.ics.saitama-u.ac.jp
Takashi Komuro, Graduate School of Science and Engineering, Saitama University, 255 Shimo-Okubo, Sakura-ku, Saitama City, Japan, komuro@mail.saitama-u.ac.jp

Abstract
In this paper, we propose a method to reduce the inconsistency between virtual and real spaces when manipulating a 3D virtual object with the user's fingers. When a user tries to hold a virtual object, the fingers do not stop on the surface of the object but thrust into it, since virtual objects cannot exert reaction force. We therefore prevent the fingers from thrusting into a virtual object by letting the object deform or glide through the fingers. A virtual object is deformed using a spring-based model and solving the equation of equilibrium. Whether the object glides through the fingers is determined by comparing the resultant force applied to the object with the resultant force of static friction when the fingers touch the object. Based on these methods, we constructed a 3D tabletop interface that enables interaction with virtual objects with a greater sense of reality.

Copyright is held by the owner/author(s). ITS '14, Nov 16-19, 2014, Dresden, Germany.
http://dx.doi.org/10.1145/2669485.2669533

Author Keywords
Virtual reality; 3D interaction; object manipulation; vision-based UI.

ACM Classification Keywords
H.5.2 [Information interfaces and presentation]: User Interfaces - Interaction styles.

Introduction
With the progress of 3D display technology, research on interaction systems that allow users to manipulate
stereoscopic 3D virtual objects has been actively conducted. For example, an interaction system has been developed that allows users to move a 3D virtual object on an autostereoscopic display by putting their hand close to the table [1], but users cannot manipulate the object directly. On the other hand, systems that enable direct interaction with a virtual object have also been developed. Yoshida et al. have developed a system that can present 42 different viewpoint images without glasses and that enables interaction with a sense of reality [2]. Interaction with a greater sense of reality in a virtual space has been realized by projecting proper 3D images according to the user's viewpoint position in real time and by moving virtual objects according to the user's motion following physical laws [3]. Alternatively, a high-speed stereo camera can be used to enhance the sense of reality in interaction with virtual objects. A system has been developed that detects the positions of the user's fingers with a high-speed stereo camera and moves a virtual object linked to those positions with little latency [4]. These systems enable interaction with virtual objects using the user's hands, but the interaction is limited to simple manipulations such as pushing. On the other hand, there are systems that realize various manipulations such as holding and lifting [5, 6], which enable flexible and realistic interaction. However, there is a problem: virtual objects cannot exert reaction force. When a user tries to hold a virtual object, for example, the fingers do not stop on the surface of the object but thrust into it. This causes inconsistency between the real space and the virtual space, and impairs the sense of reality. A solution to this problem has been proposed for VR systems using a head-mounted display (HMD) [7, 8]. Users control a virtual hand seen through the HMD with the movement of their real hand.
These systems stop the virtual fingers on the surface of an object before the fingers thrust into it. However, this method can only be used when the user's real hand is invisible and only the virtual hand is visible to the user.

In this paper, to prevent the fingers from thrusting into a virtual object, we elasticize the object and let it deform or glide through the fingers depending on the force applied to it. Thereby, we reduce the inconsistency between virtual and real spaces in manipulating a 3D virtual object with the user's fingers. We also constructed a 3D tabletop interface based on the proposed method that enables interaction with virtual objects with a greater sense of reality.

Virtual Object Deformation
Virtual object deformation is a method that elasticizes a virtual object and lets it deform along the shape of the user's fingers. In this study, we use cylindrical objects and apply a spring-based model. We consider only the top-view shape of the objects and ignore their height. Figure 2(a) shows the spring-based model used to represent a virtual object. A certain number of control points are placed uniformly on the circumference, and each control point is connected to its neighboring points and to the central point by springs. Since the control points that compose a virtual object are closely spaced when seen locally, they can be regarded as linearly connected points. Figure 2(b) shows the approximated control points, which are linearly connected. There are two cases in which control points move.
(a) Cylindrical object (b) Linear approximation
Figure 2: Spring-based model

1) Control points are moved by a finger.
2) Control points are moved by forces from neighboring springs.

Let $x_i$ be the horizontal position of control point $i$ ($i = 1, 2, \ldots, N$). When a control point is moved by a finger to the position $x_f$, $x_i$ is determined by the following equation:

$x_i = x_f$ (1)

Figure 3 shows the equilibrium of forces when control points are moved by forces from neighboring springs. Let $l_0$ be the natural length of the vertical springs, $l_i$ the length of the vertical spring at control point $i$ after deformation, $s_i$ and $\theta_i$ the length and the angle of the horizontal spring between control points $i$ and $i+1$, and $k_v$ and $k_h$ the spring constants of the vertical and horizontal springs. The equation of the equilibrium of forces at control point $i$ is written as follows:

$k_h (s_i \sin\theta_i - s_{i-1} \sin\theta_{i-1}) - k_v (l_0 - l_i) = 0$ (2)

Figure 3: Equilibrium of forces

Using the following relations,

$\sin\theta_i = (x_{i+1} - x_i) / s_i$ (3)

$\sin\theta_{i-1} = (x_i - x_{i-1}) / s_{i-1}$ (4)

together with $l_0 - l_i = x_i$, the equation can be transformed as follows:

$k_h (x_{i-1} + x_{i+1}) - (2 k_h + k_v) x_i = 0$ (5)

The positions of all control points can be calculated by solving the system of equations constructed from Eq. (1) and Eq. (5). The virtual object must be deformed along the shape of the fingers; Figure 4 shows the flow of the object deformation algorithm.
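As a concrete illustration, the system formed by Eq. (1) and Eq. (5) is linear and can be assembled and solved directly. The sketch below is our own illustration, not the authors' implementation; the function name, the parameter values, and the cyclic treatment of the closed ring of control points are assumptions.

```python
import numpy as np

def solve_equilibrium(n, k_v, k_h, fixed):
    """Solve for the control-point displacements x_i.

    Points pinned by a fingertip satisfy Eq. (1): x_i = x_f.
    All other points satisfy Eq. (5):
        k_h*(x_{i-1} + x_{i+1}) - (2*k_h + k_v)*x_i = 0
    `fixed` maps a control-point index to its fingertip position x_f.
    Neighbors are taken cyclically, since the control points form a ring.
    """
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        if i in fixed:                       # Eq. (1): pinned to the fingertip
            A[i, i] = 1.0
            b[i] = fixed[i]
        else:                                # Eq. (5): spring equilibrium
            A[i, (i - 1) % n] = k_h
            A[i, (i + 1) % n] = k_h
            A[i, i] = -(2.0 * k_h + k_v)
    return np.linalg.solve(A, b)

# One fingertip pushing control point 0 in by 5 units:
x = solve_equilibrium(16, k_v=1.0, k_h=4.0, fixed={0: 5.0})
```

Because the diagonal term $2 k_h + k_v$ strictly dominates the two off-diagonal $k_h$ terms, the matrix is nonsingular and the displacement decays smoothly and symmetrically away from the pinned point.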
Figure 4: Object deformation algorithm

The details of the algorithm are as follows.
1) When the system detects that some fingers are inside a virtual object, the control point nearest to each fingertip is moved to the position of that fingertip.
2) The other control points are moved based on the equation of the equilibrium of forces.
3) If the destination of a control point is inside a finger, the point is fixed to the nearest point on the finger's contour.
4) The other control points are again moved based on the equation of the equilibrium of forces.
5) Steps 3) and 4) are repeated until all the control points are outside the fingers.
6) The result is presented to the user.

With this algorithm, users can manipulate virtual objects without the fingers thrusting into them.

Virtual Object Gliding
Virtual object gliding is a method that lets a virtual object glide through the fingers when the object has been deformed to some extent. This prevents the object from deforming excessively even when fingers thrust deeply into it. Whether the object glides through the fingers is determined by comparing the resultant force applied to the object with the resultant force of static friction when the fingers touch the object.

Let $n$ be the number of contact points between the fingers and the object, $\mathbf{f}_i$ the force applied at contact point $i$, $\mathbf{F}$ the resultant of the $\mathbf{f}_i$, $\theta_i$ the angle between $\mathbf{f}_i$ and $\mathbf{F}$, and $\mu$ the friction coefficient. The condition for the object to glide through the fingers is as follows:

$|\mathbf{F}| > \mu \sum_{i=1}^{n} |\mathbf{f}_i| \cos\theta_i$ (6)

Using this method, pushing and holding manipulations can be treated in a unified way. Figure 5 shows the forces applied to a virtual object when $n = 2$.

3D Tabletop Interface
Based on the proposed methods, we constructed a 3D tabletop interface system that enables interaction with virtual objects. We modified the system developed by
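The gliding test of Eq. (6) amounts to a few lines of vector arithmetic. The following sketch is our own illustrative implementation of that condition, not the authors' code; the function name and the assumption of 2D top-view contact forces are ours.

```python
import math

def glides(forces, mu):
    """Decide whether the object glides through the fingers, per Eq. (6).

    forces: list of 2D contact forces (fx, fy) applied by the fingers.
    Returns True when the magnitude of the resultant F exceeds the
    resultant static friction  mu * sum(|f_i| * cos(theta_i)),
    where theta_i is the angle between f_i and F.
    """
    Fx = sum(f[0] for f in forces)
    Fy = sum(f[1] for f in forces)
    F = math.hypot(Fx, Fy)
    if F == 0.0:        # forces cancel (e.g. a symmetric pinch): no gliding
        return False
    friction = 0.0
    for fx, fy in forces:
        mag = math.hypot(fx, fy)
        if mag > 0.0:
            cos_theta = (fx * Fx + fy * Fy) / (mag * F)  # angle between f_i and F
            friction += mu * mag * cos_theta
    return F > friction

# Two fingers pushing in nearly the same direction overcome friction,
# while an exactly opposed pinch has a zero resultant:
print(glides([(1.0, 0.1), (1.0, -0.1)], mu=0.5))   # prints True
print(glides([(1.0, 0.0), (-1.0, 0.0)], mu=0.5))   # prints False
```

In the first case the resultant exceeds the friction bound and the object glides; in the second the pinch produces no resultant, so the object deforms instead. This is how the method unifies pushing and holding in a single test.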
Kusano et al. [9] to construct our system. The system consists of an upward-facing multi-view autostereoscopic display and a camera installed above the display. The appearance of the system is shown in Figure 6. Users can interact with 3D virtual objects presented on the display. Hand regions are extracted from the camera image using color information, and the system recognizes whether fingers are inside a virtual object. We implemented the virtual object deformation and gliding methods in this system. The results of deformation and gliding are illustrated in Figure 7. The processing time was short enough that the system ran in real time at 60 fps.

Figure 5: Forces applied to a virtual object
Figure 6: 3D tabletop interface
Figure 7: Object deformation and gliding

Conclusions
In this study, we proposed a method to reduce the inconsistency between virtual and real spaces in manipulating 3D virtual objects. By elasticizing a virtual object, the system prevents the user's fingers from thrusting into the virtual object. Based on this approach, we constructed a 3D tabletop interface that enables interaction with virtual objects with a greater sense of reality. In future work, we will support more shapes of virtual objects, such as cubes and triangular pyramids, realize
a greater variety of operations such as pushing and lifting, and construct practical application systems.

References
1. Kobayashi, K., Oikawa, M., Koike, T., Utsugi, K., Yamasaki, M., and Kitagawa, S. Character Interaction System with Autostereoscopic Display and Range Sensor. In Proc. 3DUI 2007, 95-98.
2. Yoshida, T., Kamuro, S., Minamizawa, K., Nii, H., and Tachi, S. RePro3D: Full-parallax 3D Display with Haptic Feedback using Retro-reflective Projection Technology. In Proc. ISVRI 2011, 49-54.
3. Benko, H., Jota, R., and Wilson, A. MirageTable: Freehand Interaction on a Projected Augmented Reality Tabletop. In Proc. CHI 2012, 199-208.
4. Niikura, T., and Komuro, T. 3D Touch Panel Interface Using an Autostereoscopic Display. In Proc. ITS 2012, 295-298.
5. Hilliges, O., Kim, D., Izadi, S., Weiss, M., and Wilson, A. HoloDesk: Direct 3D Interactions with a Situated See-Through Display. In Proc. CHI 2012, 2421-2430.
6. Lee, J., Olwal, A., Ishii, H., and Boulanger, C. SpaceTop: Integrating 2D and Spatial 3D Interactions in a See-through Desktop Environment. In Proc. CHI 2013, 189-192.
7. Prachyabrued, M., and Borst, C.W. Dropping the Ball: Releasing a Virtual Grasp. In Proc. 3DUI 2011, 59-66.
8. Burns, E., Razzaque, S., Panter, A.T., Whitton, M.C., McCallus, M.R., and Brooks Jr., F.P. The Hand is Slower than the Eye: A Quantitative Exploration of Visual Dominance over Proprioception. In Proc. VR 2005, 3-10.
9. Kusano, T., Niikura, T., and Komuro, T. A Virtually Tangible 3D Interaction System using an Autostereoscopic Display. In Proc. SUI 2013, 87.